
Implement simulation object validation checks #94

Conversation

@rmoretti9 (Contributor)

This pull request addresses issue #91

cla-bot bot commented May 30, 2024

Thank you for your pull request and welcome to our community.

We require contributors to sign our Contributor License Agreement (CLA), and it seems that the following contributors have not yet signed it: @rmoretti9.

In order for us to review and merge your code, please sign the CLA at meetiqm.com.

Once you have signed the CLA, write a comment here and we'll verify. Please note that it may take some time for us to verify it.

@rmoretti9 (Contributor Author)

I have now signed the CLA

@qpavsmi (Contributor) commented May 30, 2024

Oh hi :D Thanks for the submission! It will take me some time to receive confirmation that your CLA form went through, but in the meantime I will go through the code and test that it works.

@rmoretti9 (Contributor Author)

I introduced the function validate_simulation inside klayout_package/python/kqcircuits/simulations/export/simulation_export.py, so that both export_elmer and export_ansys perform some checks before doing any other operations. For the moment, I implemented the checks suggested in the "Concrete validation scenarios to start with" part of the issue description, and tested them with the script klayout_package/python/scripts/simulations/swissmon_fluxline_sim.py by commenting out different parts of SwissmonFluxlineSim to trigger the error messages.

I'm currently trying to figure out how to simplify the introduction of new validation checks and make them less prone to nested if statements. For instance, port_and_simulation_type_compatibility and flux_integration_layer_exists_if_needed are fairly simple functions, but perhaps not that intuitive at first.
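
A minimal sketch of the dispatch idea, using the check names mentioned above (the exact signatures in the PR may differ):

# Hypothetical sketch: run each registered check in turn, where every check
# raises on a misconfigured simulation. The signature here is an assumption.
def validate_simulation(simulation):
    checks = [
        port_and_simulation_type_compatibility,
        flux_integration_layer_exists_if_needed,
    ]
    for check in checks:
        check(simulation)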

@qpavsmi (Contributor) commented May 30, 2024

@cla-bot check

cla-bot bot added the cla-signed label on May 30, 2024
cla-bot bot commented May 30, 2024

The cla-bot has been summoned, and re-checked this pull request!

@qpavsmi (Contributor) left a comment

I would recommend changing the function signature of validate_simulation so that instead of taking a list of Simulations, it takes a Simulation and a Solution object and handles them explicitly, rather than addressing them as simulation[0] and simulation[1]. You can see, for example, in elmer_export.py that if it detects there is no Solution object paired with a Simulation, it will construct a Solution object from solution_parameters. But at that point we have already run validate_simulation. So maybe validate_simulation could instead be invoked at the beginning of the for sim_sol in simulations loop, where we have explicitly defined simulation and solution.

port_and_simulation_type_compatibility could probably be separated into two cases: "if no ports, solution can't be X" and "if edge port, solution can't be X". It's good that these validation functions have a single responsibility, and it's better to have a bigger list of clearly defined checks than a small list of checks that perform subchecks. One ValueError raise per function is probably a good rule of thumb. The check functions could also be moved into a distinct new file so they are more easily maintained.

You could maybe introduce a custom exception type instead of using ValueError; we might want to treat these errors differently from generic errors in the future.

We would also like some unit tests for these checks. Since the checks don't rely on any real geometry, only on ports, you could probably mock a Simulation object without a Cell and just insert arbitrary ports in there.
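
To make these points concrete, a rough sketch of the suggested shape (the exception name matches the ValidateSimError used later in this thread, but the check bodies and the choice of AnsysHfssSolution/EdgePort in them are assumptions, not the final code):

# Hypothetical sketch of single-responsibility checks with a custom exception.
class ValidateSimError(Exception):
    """Raised when a simulation/solution pair is misconfigured."""

def check_ports_exist(simulation, solution):
    # "If no ports, solution can't be X"
    if isinstance(solution, AnsysHfssSolution) and not simulation.ports:
        raise ValidateSimError(f"{type(solution).__name__} requires at least one port")

def check_no_edge_ports(simulation, solution):
    # "If edge port, solution can't be X"
    if isinstance(solution, AnsysHfssSolution) and any(isinstance(p, EdgePort) for p in simulation.ports):
        raise ValidateSimError(f"{type(solution).__name__} does not support edge ports")

def validate_simulation(simulation, solution):
    # Explicit Simulation and Solution arguments, one raise per check
    for check in (check_ports_exist, check_no_edge_ports):
        check(simulation, solution)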

@rmoretti9 (Contributor Author)

Thank you @qpavsmi, I think I have implemented your suggestions now, but I'm a bit confused about the unit tests since I have very little experience with them. I made a test module in KQCircuits\tests\simulations\simulation_validate\test_simulation_validate.py, but I'm not sure whether it works properly. When I run pytest --cov --cov-report term-missing -rf, the test fails for empty Simulations because it tries to match them with AnsysHfssSolution.

@qpavsmi (Contributor) commented Jun 3, 2024

> Thank you @qpavsmi, I think I have implemented your suggestions now, but I'm a bit confused about the unit tests since I have very little experience with them. I made a test module in KQCircuits\tests\simulations\simulation_validate\test_simulation_validate.py, but I'm not sure whether it works properly. When I run pytest --cov --cov-report term-missing -rf, the test fails for empty Simulations because it tries to match them with AnsysHfssSolution.

Took me some time to realise this, but it's actually not your test that is failing: your pull request breaks one of the existing tests, test_export_produces_output_files. That test creates a completely empty simulation (no ports either) and checks that it can be exported without causing errors. If the solution type is not specified, it defaults to AnsysHfssSolution. But you now validate that there must be a port if such a solution type is used, hence the test fails.

It's best to modify this test so that it sets the solution type to something that passes this check. One way to do that is to fix the fixtures we defined for simulation tests. Notice that the failing unit test takes perform_test_ansys_export_produces_output_files as an argument and then calls it like a function. Where does that come from? In tests/conftest.py we define the fixture, which is a function that returns another function, perform_test_ansys_export_produces_output_implementation.

As this is getting very complicated, I'll just say exactly what to do to fix this test. We would like to be able to set which solution type to use by calling perform_test_ansys_export_produces_output_files(EmptySimulation, ansys_solution=AnsysSolution()) in test_export_produces_output_files. To do that, we modify the fixture in tests/conftest.py. The fixture itself takes other fixtures as arguments: tmp_path is a built-in fixture that provides a path to a temporary directory, and get_simulation is defined above it. The new ansys_solution argument, however, should go into the inner function perform_test_ansys_export_produces_output_implementation that the fixture returns. That piece of code would then look something like:

# (Path and export_ansys are assumed to be imported at the top of tests/conftest.py)
@pytest.fixture
def perform_test_ansys_export_produces_output_files(tmp_path, get_simulation):
    def perform_test_ansys_export_produces_output_implementation(cls, ansys_solution=None, **parameters):
        simulation = get_simulation(cls, **parameters)
        # Pair the simulation with the given solution; fall back to the default if none is given
        bat_filename = export_ansys([(simulation, ansys_solution) if ansys_solution else simulation], path=tmp_path)
        assert Path(bat_filename).exists()

    return perform_test_ansys_export_produces_output_implementation

After this change the test should pass. Consider making a similar change for the existing Elmer test fixtures that we have.
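
With that in place, the failing test can pin the solution type explicitly, along these lines (the exact import paths for EmptySimulation and AnsysSolution are assumptions):

def test_export_produces_output_files(perform_test_ansys_export_produces_output_files):
    # Use a solution type that passes the new validation checks for an empty simulation
    perform_test_ansys_export_produces_output_files(EmptySimulation, ansys_solution=AnsysSolution())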

I would recommend studying the fixtures, since that could help you structure the tests you are writing better. Generally it's good practice to write each test function in such a way that it can be run independently, which means it sets up everything needed for the test (and tears it down if needed). Of course, it gets annoying when the same set-up is copy-pasted into each unit test. This is where fixtures are useful: pytest makes sure that a fixture is reconstructed from scratch for every unit test, so that tests do not depend on their order of execution. So in your example, the initialisation of MockSimulation could be made into a fixture, as sketched below.

You will also need to test the negative cases: with this simulation and that solution, I expect validation to fail. You can do this using a with pytest.raises(ValidateSimError) as e: block; google for examples or see test_chip_coordinate_conversion.py.
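
A minimal sketch combining both points, assuming a hypothetical MockSimulation and the ValidateSimError type suggested earlier:

import pytest

@pytest.fixture
def mock_simulation():
    # Reconstructed from scratch for every test, keeping tests order-independent
    return MockSimulation()  # hypothetical port-only Simulation without a Cell

def test_hfss_solution_without_ports_fails(mock_simulation):
    # Negative case: an HFSS solution with no ports should fail validation
    with pytest.raises(ValidateSimError):
        validate_simulation(mock_simulation, AnsysHfssSolution())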

@qpavsmi (Contributor) commented Jun 6, 2024

Remember to run python -m black -l 120 -t py39 . once you are close to being done

@rmoretti9 (Contributor Author)

Hi @qpavsmi, thanks for the useful suggestions. I fixed the perform_test_ansys_export_produces_output_files fixture and implemented a few tests (two for each sim validation check). Let me know if I can improve it further.

@qpavsmi (Contributor) commented Jun 10, 2024

I think after addressing this round of suggestions we can approve and get this merged :)

@rmoretti9 marked this pull request as ready for review on June 10, 2024 10:23
@rmoretti9 force-pushed the 91-validate-and-report-misconfigured-simulation-export branch from 161af97 to c3fe9b2 on June 10, 2024 10:32
@rmoretti9 (Contributor Author)

> I think after addressing this round of suggestions we can approve and get this merged :)

Thanks @qpavsmi, I squashed my commits and marked this Pull Request as ready for review

@qpavsmi (Contributor) commented Jun 10, 2024

> Thanks @qpavsmi, I squashed my commits and marked this Pull Request as ready for review

I'm not sure if it's visible to you, but there are many pending review points that I outlined based on the code changes.
[screenshot of the pending review comments]

These would be good to address before we merge

@rmoretti9 force-pushed the 91-validate-and-report-misconfigured-simulation-export branch from c3fe9b2 to 97fd20c on June 10, 2024 15:56
@rmoretti9 (Contributor Author)

> > Thanks @qpavsmi, I squashed my commits and marked this Pull Request as ready for review
>
> I'm not sure if it's visible to you, but there are many pending review points that I outlined based on the code changes. [screenshot of the pending review comments]
>
> These would be good to address before we merge

It seems like I cannot access these comments (or I don't know how to do that), but I modified the code based on the screenshot's suggestions, except for the one about adding the empty new line: I'm not sure where it should be added.

@qpavsmi (Contributor) left a comment

Should be visible now

@qpavsmi (Contributor) commented Jun 10, 2024

> It seems like I cannot access these comments (or I don't know how to do that), but I modified the code based on the screenshot's suggestions, except for the one about adding the empty new line: I'm not sure where it should be added.

You should be able to see the suggestions now I think: #94 (review)

@rmoretti9 force-pushed the 91-validate-and-report-misconfigured-simulation-export branch from 97fd20c to f2dbe58 on June 11, 2024 10:31
@rmoretti9 (Contributor Author)

I applied all the suggested changes. Maybe the usage of @pytest.mark.parametrize in the tests is a bit redundant as it is now, but it allows adding new tests quite easily.
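
For reference, the parametrized pattern looks roughly like this (which simulation/solution pairs should fail validation is an assumption here, and mock_simulation is the hypothetical fixture from earlier):

import pytest

@pytest.mark.parametrize(
    "solution",
    [AnsysHfssSolution()],  # new failing cases can simply be appended here
)
def test_portless_simulation_fails_validation(mock_simulation, solution):
    with pytest.raises(ValidateSimError):
        validate_simulation(mock_simulation, solution)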

qpavsmi previously approved these changes Jun 11, 2024
@qpavsmi (Contributor) left a comment

Thanks for sticking through the review process :D The implementation is very clear now and the unit tests are comprehensive. Approving now; will merge after I have run some local tests.

@rmoretti9 force-pushed the 91-validate-and-report-misconfigured-simulation-export branch from f2dbe58 to 3124abb on June 11, 2024 12:08
qpavsmi previously approved these changes Jun 11, 2024
@qpavsmi (Contributor) left a comment

Approving again after fixing linter issues

@rmoretti9 force-pushed the 91-validate-and-report-misconfigured-simulation-export branch from 3124abb to 3b372c7 on June 11, 2024 12:21
@qpavsmi merged commit 93c4176 into iqm-finland:main on Jun 11, 2024 (19 checks passed)